TransformerLSR: Attentive Joint Model of Longitudinal Data, Survival, and Recurrent Events with Concurrent Latent Structure

Zhang, Zhiyue, Zhao, Yao, Xu, Yanxun

arXiv.org Machine Learning

In applications such as biomedical studies, epidemiology, and social sciences, recurrent events often co-occur with longitudinal measurements and a terminal event, such as death. Therefore, jointly modeling longitudinal measurements, recurrent events, and survival data while accounting for their dependencies is critical. While joint models for the three components exist in the statistical literature, many of these approaches are limited by heavy parametric assumptions and scalability issues. Recently, incorporating deep learning techniques into joint modeling has shown promising results. However, current methods only address joint modeling of longitudinal measurements at regularly spaced observation times together with survival events, neglecting recurrent events. In this paper, we develop TransformerLSR, a flexible transformer-based deep modeling and inference framework to jointly model all three components simultaneously. TransformerLSR integrates deep temporal point processes into the joint modeling framework, treating recurrent and terminal events as two competing processes dependent on past longitudinal measurements and recurrent event times. Additionally, TransformerLSR introduces a novel trajectory representation and model architecture to potentially incorporate a priori knowledge of known latent structures among concurrent longitudinal variables. We demonstrate the effectiveness and necessity of TransformerLSR through simulation studies and the analysis of a real-world medical dataset on patients after kidney transplantation.
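To make the competing-processes idea concrete, here is a minimal sketch of a competing-risks temporal point process likelihood, where recurrent and terminal events each have a conditional intensity driven by a summary of past longitudinal measurements. All function names, the exponential link, and the parameter values are illustrative assumptions, not TransformerLSR's actual (transformer-parameterized) intensities.

```python
import math

def intensity(history_summary, base_rate, weight):
    """Conditional intensity lambda(t | history) = base_rate * exp(weight * summary).
    In TransformerLSR the history summary would come from a transformer encoder;
    here it is just a scalar covariate (an assumption for illustration)."""
    return base_rate * math.exp(weight * history_summary)

def log_likelihood(event_times, event_types, summaries, params):
    """Competing-risks point-process log-likelihood under a piecewise-constant
    intensity approximation: log-intensity of the process that fired at each
    event, minus the integrated total intensity over each inter-event gap."""
    ll = 0.0
    prev_t = 0.0
    for t, kind, s in zip(event_times, event_types, summaries):
        lam_rec = intensity(s, params["rec_rate"], params["rec_w"])
        lam_term = intensity(s, params["term_rate"], params["term_w"])
        # survival term: neither process fires in (prev_t, t)
        ll -= (lam_rec + lam_term) * (t - prev_t)
        # event term: the process that actually fired contributes its log-intensity
        ll += math.log(lam_rec if kind == "recurrent" else lam_term)
        prev_t = t
    return ll

# Two recurrent events followed by the terminal event, with rising risk summaries
params = {"rec_rate": 0.5, "rec_w": 0.3, "term_rate": 0.1, "term_w": 0.8}
ll = log_likelihood([1.0, 2.5, 4.0], ["recurrent", "recurrent", "terminal"],
                    [0.2, 0.5, 1.1], params)
```

Maximizing this quantity over the intensity parameters (jointly with a longitudinal submodel) is the basic training signal a deep joint model of this kind optimizes.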


MEDIMP: 3D Medical Images with clinical Prompts from limited tabular data for renal transplantation

Milecki, Leo, Kalogeiton, Vicky, Bodard, Sylvain, Anglicheau, Dany, Correas, Jean-Michel, Timsit, Marc-Olivier, Vakalopoulou, Maria

arXiv.org Artificial Intelligence

Renal transplantation emerges as the most effective solution for end-stage renal disease. Arising from complex causes, a substantial risk of chronic transplant dysfunction persists and may lead to graft loss. Medical imaging plays a central role in renal transplant monitoring in clinical practice. However, graft supervision is multi-disciplinary, notably joining nephrology, urology, and radiology, and identifying robust prognostic biomarkers from such high-dimensional and complex data is challenging. In this work, taking inspiration from the recent success of Large Language Models (LLMs), we propose MEDIMP -- Medical Images with clinical Prompts -- a model to learn meaningful multi-modal representations of renal transplant Dynamic Contrast-Enhanced Magnetic Resonance Imaging (DCE MRI) by incorporating structural clinicobiological data after translating them into text prompts. MEDIMP is based on contrastive learning from joint text-image paired embeddings to perform this challenging task. Moreover, we propose a framework that generates medical prompts using automatic textual data augmentations from LLMs. Our goal is to learn meaningful manifolds of renal transplant DCE MRI that are relevant to the prognosis of the transplant or patient status (2, 3, and 4 years after the transplant), making the most efficient use of the limited available multi-modal data. Extensive experiments and comparisons with other renal transplant representation learning methods with limited data prove the effectiveness of MEDIMP in a relevant clinical setting, giving new directions toward medical prompts. Our code is available at https://github.com/leomlck/MEDIMP.
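The "contrastive learning from joint text-image paired embeddings" that MEDIMP builds on can be sketched as a symmetric CLIP-style objective: matching image/prompt pairs are pulled together in a shared embedding space while mismatched pairs within the batch are pushed apart. The shapes, temperature value, and NumPy implementation below are assumptions for illustration, not the paper's actual training code.

```python
import numpy as np

def contrastive_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE over a batch of paired (image, text) embeddings,
    both of shape (batch, dim). Matching pairs sit on the diagonal of the
    batch-by-batch cosine-similarity matrix."""
    # L2-normalize so dot products are cosine similarities
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
    logits = img @ txt.T / temperature          # (batch, batch) similarities
    labels = np.arange(logits.shape[0])         # correct match index per row

    def cross_entropy(l):
        l = l - l.max(axis=1, keepdims=True)    # numerical stability
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -logp[labels, labels].mean()     # log-prob of the true pair

    # average of image-to-text and text-to-image directions
    return (cross_entropy(logits) + cross_entropy(logits.T)) / 2

rng = np.random.default_rng(0)
loss = contrastive_loss(rng.normal(size=(4, 8)), rng.normal(size=(4, 8)))
```

In a setup like MEDIMP's, `img_emb` would come from a 3D image encoder over DCE MRI volumes and `txt_emb` from a text encoder over the LLM-augmented clinical prompts, with both encoders trained to minimize this loss.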


Randomized Trial of "Corollary Orders" to Prevent Errors of Omission (Journal of the American Medical Informatics Association)

AITopics Original Links

Objective: Errors of omission are a common cause of systems failures. Physicians often fail to order tests or treatments needed to monitor or ameliorate the effects of other tests or treatments. The authors hypothesized that automated, guideline-based reminders to physicians, provided as they wrote orders, could reduce these omissions. Design: The study was performed on the inpatient general medicine ward of a public teaching hospital. Faculty and housestaff from the Indiana University School of Medicine, who used computer workstations to write orders, were randomized to intervention and control groups. As intervention physicians wrote orders for 1 of 87 selected tests or treatments, the computer suggested corollary orders needed to detect or ameliorate adverse reactions to the trigger orders. The physicians could accept or reject these suggestions. Results: During the 6-month trial, reminders about corollary orders were presented to 48 intervention physicians and withheld from 41 control physicians. Intervention physicians ordered the suggested corollary orders in 46.3% of instances when they received a reminder, compared with 21.9% compliance by control physicians (p < 0.0001). Physicians discriminated in their acceptance of suggested orders, readily accepting some while rejecting others. Pharmacists initiated one third fewer interventions with physicians in the intervention group than in the control group. Conclusion: This study demonstrates that physician workstations, linked to a comprehensive electronic medical record, can be an efficient means of decreasing errors of omission and improving adherence to practice guidelines.